mean field distribution
Exploratory LQG Mean Field Games with Entropy Regularization
Firoozi, Dena, Jaimungal, Sebastian
We study a general class of entropy-regularized multi-variate LQG mean field games (MFGs) in continuous time with $K$ distinct sub-populations of agents. We extend the notion of actions to action distributions (exploratory actions), and explicitly derive the optimal action distributions for individual agents in the limiting MFG. We demonstrate that the optimal set of action distributions yields an $\epsilon$-Nash equilibrium for the finite-population entropy-regularized MFG. Furthermore, we compare the resulting solutions with those of classical LQG MFGs and establish the equivalence of their existence.
- North America > Canada > Ontario > Toronto (0.14)
- North America > Canada > Quebec > Montreal (0.04)
- North America > United States > New York (0.04)
- Europe > Switzerland > Basel-City > Basel (0.04)
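The key move in the abstract above, replacing point actions with action distributions under an entropy term, can be illustrated on a toy one-dimensional static problem (this is a hedged sketch, not the paper's MFG derivation): minimizing $\mathbb{E}_\pi[\tfrac{r}{2}a^2] - \lambda H(\pi)$ over densities $\pi$ gives the Gibbs density $\pi(a) \propto \exp(-\tfrac{r}{2}a^2/\lambda)$, a Gaussian with variance $\lambda/r$, which a quick grid computation confirms:

```python
import numpy as np

# Toy static sketch of entropy-regularized control: the minimizer of
# E_pi[(r/2) a^2] - lam * H(pi) is the Gibbs density
# pi(a) ∝ exp(-(r/2) a^2 / lam), i.e. Gaussian with variance lam / r.
r, lam = 2.0, 0.5
a = np.linspace(-5, 5, 2001)
da = a[1] - a[0]

gibbs = np.exp(-(r / 2) * a**2 / lam)
gibbs /= gibbs.sum() * da          # normalize to a density on the grid

var = (gibbs * a**2).sum() * da    # zero mean, so second moment = variance
print(var, lam / r)                # the two values agree (≈ 0.25)
```

The temperature $\lambda$ controls exploration: larger $\lambda$ widens the optimal action distribution, and $\lambda \to 0$ recovers the deterministic classical LQ action.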
Approximating Posterior Distributions in Belief Networks Using Mixtures
Bishop, Christopher M., Lawrence, Neil D., Jaakkola, Tommi, Jordan, Michael I.
Exact inference in densely connected Bayesian networks is computationally intractable, and so there is considerable interest in developing effective approximation schemes. One approach which has been adopted is to bound the log likelihood using a mean-field approximating distribution. While this leads to a tractable algorithm, the mean field distribution is assumed to be factorial and hence unimodal. In this paper we demonstrate the feasibility of using a richer class of approximating distributions based on mixtures of mean field distributions. We derive an efficient algorithm for updating the mixture parameters and apply it to the problem of learning in sigmoid belief networks. Our results demonstrate a systematic improvement over simple mean field theory as the number of mixture components is increased.
- Asia > Middle East > Jordan (0.07)
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.04)
- Europe > United Kingdom (0.04)
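The limitation the abstract above addresses, a factorial (fully factorized) mean-field distribution being unimodal, can be seen in a minimal numpy sketch (an illustration of the general idea, not the paper's update algorithm): a bimodal posterior over two binary units, with mass only on $(0,0)$ and $(1,1)$, cannot be matched by any single factorial Bernoulli distribution, but a two-component mixture of factorial distributions represents it exactly.

```python
import numpy as np

def factorial_prob(s, mu):
    # Probability of binary state s under a fully factorized (mean-field)
    # Bernoulli distribution with per-unit means mu
    return np.prod(np.where(s == 1, mu, 1.0 - mu))

def mixture_prob(s, pis, mus):
    # Mixture of mean-field distributions:
    # q(s) = sum_m pi_m * prod_i mu_{mi}^{s_i} (1 - mu_{mi})^{1 - s_i}
    return sum(pi * factorial_prob(s, mu) for pi, mu in zip(pis, mus))

# Two components, one concentrated on each mode of the target posterior
pis = [0.5, 0.5]
mus = [np.array([0.0, 0.0]), np.array([1.0, 1.0])]

for s in [np.array(x) for x in [(0, 0), (0, 1), (1, 0), (1, 1)]]:
    print(tuple(s), mixture_prob(s, pis, mus))
# puts probability 0.5 on (0,0) and (1,1), and 0 on the off-modes
```

A single factorial distribution with means $(\mu_1,\mu_2)$ would assign the off-mode states $(0,1)$ and $(1,0)$ probability $\mu_1(1-\mu_2)+(1-\mu_1)\mu_2 > 0$ whenever it puts mass on both modes, which is why adding mixture components enlarges the approximating family.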